## Space Complexity: Measuring Memory Usage $S(N)$
Space complexity $S(N)$ quantifies the amount of memory an algorithm requires to run to completion, relative to the input size $N$.
Crucially, when analyzing efficiency, we focus on Auxiliary Space—the extra space used by the algorithm beyond the storage required for the input itself.
- Input Space: Memory needed to store the input data (usually $O(N)$).
- Auxiliary Space ($S(N)$): Temporary variables, data structures (e.g., queues, hash maps), and recursion stack frames. This is the metric we primarily use when comparing space efficiency (see the sketch below).
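To make the distinction concrete, here is a minimal Python sketch (the function names are illustrative, not from any particular library) of two ways to reverse an array: one allocates a new $O(N)$ array, the other works in place with $O(1)$ auxiliary space.

```python
# Both functions reverse an array of size N. The input itself always
# occupies O(N) memory; only the *extra* memory they allocate differs.

def reverse_copy(arr):
    # Builds a brand-new list of N elements -> auxiliary space S(N) = O(N).
    return arr[::-1]

def reverse_in_place(arr):
    # Uses only two index variables, regardless of N -> auxiliary space S(N) = O(1).
    left, right = 0, len(arr) - 1
    while left < right:
        arr[left], arr[right] = arr[right], arr[left]
        left += 1
        right -= 1
    return arr

print(reverse_copy([1, 2, 3, 4]))      # [4, 3, 2, 1]
print(reverse_in_place([1, 2, 3, 4]))  # [4, 3, 2, 1]
```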
### Common Space Complexity Classes
| Complexity | Description | Example Algorithm |
|---|---|---|
| $O(1)$ | Constant (In-Place) | Two-Pointer Swap (e.g., array reversal) |
| $O(\log N)$ | Logarithmic | QuickSort (recursion call stack, average case) |
| $O(N)$ | Linear | Breadth-First Search (queue), storing $N$ entries in a hash map |
| $O(N^2)$ | Quadratic | Algorithms requiring an $N \times N$ matrix for storage |
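As an illustration of the linear, $O(N)$ row above, the following Python sketch (the function name and sample graph are illustrative) shows why Breadth-First Search is charged linear auxiliary space: the visited set and the queue can each grow to hold on the order of $N$ nodes.

```python
from collections import deque

def bfs_order(graph, start):
    # 'graph' is an adjacency-list dict: node -> list of neighbours.
    # The visited set and the queue can each hold up to N nodes,
    # so the auxiliary space is O(N).
    visited = {start}
    queue = deque([start])
    order = []
    while queue:
        node = queue.popleft()
        order.append(node)
        for neighbour in graph[node]:
            if neighbour not in visited:
                visited.add(neighbour)
                queue.append(neighbour)
    return order

graph = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2]}
print(bfs_order(graph, 0))  # [0, 1, 2, 3]
```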
### Definition: In-Place Algorithms
An algorithm is considered In-Place if it modifies the input data structure directly without requiring temporary storage proportional to the input size $N$.
- Strict $O(1)$ Definition: Requires only a constant amount of auxiliary space, independent of $N$. (e.g., using a fixed number of temporary variables for swaps).
- Practical $O(\log N)$ Inclusion: Algorithms that use recursion (like recursive QuickSort) may use $O(\log N)$ space for the recursion stack. These are often categorized as 'in-place' in a practical context, in sharp contrast to $O(N)$ solutions (see the sketch after this list).
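The sketch below illustrates this practical case. It is a minimal recursive QuickSort using the Lomuto partition scheme with the last element as the pivot (details chosen for brevity, not a canonical implementation): the partitioning step itself needs only $O(1)$ extra variables, while the recursion stack accounts for the $O(\log N)$ average-case auxiliary space.

```python
def quicksort(arr, lo=0, hi=None):
    # Sorts 'arr' in place: partitioning uses O(1) extra variables,
    # but each recursive call adds a stack frame, so the auxiliary
    # space equals the recursion depth -- O(log N) on average,
    # O(N) in the worst case (e.g., already-sorted input with this pivot choice).
    if hi is None:
        hi = len(arr) - 1
    if lo >= hi:
        return
    pivot = arr[hi]                      # Lomuto partition: last element as pivot
    i = lo
    for j in range(lo, hi):
        if arr[j] <= pivot:
            arr[i], arr[j] = arr[j], arr[i]
            i += 1
    arr[i], arr[hi] = arr[hi], arr[i]    # place pivot in its final position
    quicksort(arr, lo, i - 1)            # recurse on the left partition
    quicksort(arr, i + 1, hi)            # recurse on the right partition

data = [5, 2, 9, 1, 7]
quicksort(data)
print(data)  # [1, 2, 5, 7, 9]
```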
### 📝 Knowledge Check
1. An algorithm sorts an array of size $N$ by creating a completely new sorted array. What is its Auxiliary Space Complexity $S(N)$?
2. Why is the auxiliary space of a typical recursive QuickSort considered $O(\log N)$ and not $O(1)$?
3. Which of the following is NOT typically considered an in-place algorithm: two-pointer array reversal, recursive QuickSort, or Merge Sort?